Web Survey Bibliography
In the last two decades, Web or Internet surveys have had a profound impact on the survey world. The change has been felt most strongly in the market research sector, where many companies have switched from telephone surveys or other modes of data collection to online surveys. The academic and public policy/social attitude sectors were slower to adopt, being more careful about evaluating the effect of the change on key surveys and trends, and conducting research on how best to design and implement Web surveys. The public sector (i.e., government statistical offices) has been the slowest to embrace Web surveys, in part because the stakes are much higher, both in terms of the precision requirements of the estimates and in terms of the public scrutiny of such data. However, National Statistical Offices (NSOs) are heavily engaged in research and development with regard to Web surveys, most notably as part of a mixed-mode data collection strategy, or in the establishment survey world, where repeated measurement and quick turnaround are the norm.
Along with the uneven progress in the adoption of Web surveys have come a number of concerns about the method, particularly with regard to its representational or inferential aspects. At the same time, a great deal of research has been conducted on the measurement side of Web surveys, developing ways to improve the quality of data collected using this medium. This seminar focuses on these two key elements of Web surveys: inferential issues and measurement issues. Each of these broad areas is covered in turn in the following sections. The inferential section is largely concerned with methods of sampling for Web surveys and the associated coverage and nonresponse issues. Different ways in which samples are drawn, using both non-probability and probability-based approaches, are discussed. The assumptions behind the different approaches to inference in Web surveys, the benefits and risks inherent in each, and the appropriate use of particular approaches to sample selection are reviewed. The following section then addresses a variety of issues related to the design of Web survey instruments, with a review of the empirical literature and practical recommendations for design to minimize measurement error.
A total survey error framework (see Deming, 1944; Kish, 1965; Groves, 1989) is useful for evaluating the quality or value of a method of data collection such as Web or Internet surveys. In this framework, there are several sources of error in surveys, which can be divided into two main groups: errors of non-observation and errors of observation. Errors of non-observation refer to failures to observe or measure eligible members of the population of interest, and include coverage errors, sampling errors, and nonresponse errors; they primarily concern issues of selection bias. Errors of observation are also called measurement errors (see Biemer et al., 1991; Lessler and Kalsbeek, 1992). Sources of measurement error include the respondent, the instrument, the mode of data collection, and (in interviewer-administered surveys) the interviewer. In addition, processing errors can affect all types of surveys. Errors can also be classified according to whether they affect the variance or the bias of survey estimates, both contributing to the overall mean square error (MSE) of a survey statistic. A total survey error perspective aims to minimize the mean square error for a set of survey statistics, given a set of resources. Thus, cost and time are also important elements in evaluating the quality of a survey.
While Web surveys are generally much less expensive than other modes of data collection, and are quicker to conduct, serious concerns have been raised about errors of non-observation or selection bias. On the other hand, there is growing evidence that using Web surveys can improve the quality of the data collected (i.e., reduce measurement errors) relative to other modes, depending on how the instruments are designed. Given this framework, we first discuss errors of non-observation or selection bias that may raise concerns about the inferential value of Web surveys, particularly those targeted at the general population. Then, in the second part, we discuss ways that the design of the Web survey instrument can affect measurement errors.
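The relationship between bias, variance, and mean square error referenced above is the standard textbook decomposition (a general statistical identity, not specific to this seminar), which can be written as:

```latex
% Mean square error of an estimator \hat{\theta} of a population parameter \theta:
\mathrm{MSE}(\hat{\theta})
  = \mathbb{E}\!\left[(\hat{\theta} - \theta)^{2}\right]
  = \mathrm{Var}(\hat{\theta}) + \left[\mathrm{Bias}(\hat{\theta})\right]^{2},
\qquad
\mathrm{Bias}(\hat{\theta}) = \mathbb{E}[\hat{\theta}] - \theta .
```

In total survey error terms, each error source (coverage, sampling, nonresponse, measurement, processing) contributes to the variance term, the squared-bias term, or both; minimizing MSE for a fixed budget means trading these components off against cost and time.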
Web survey bibliography - Reports, seminars (231)
- Understanding Society Innovation Panel Wave 4: Results from Methodological Experiments; 2012; Burton, J., Budd, S., Kaminska, O., Uhrig, S. C. N., Brown, M., Calderwood, L.
- The Propensity of Older Respondents to Participate in a General Purpose Survey; 2012; Lynn, P.
- Mode-Switch Protocols: How a Seemingly Small Design Difference can affect Attrition Rates and Attrition...; 2012; Lynn, P.
- The Confirmit Annual Market Research Software Survey 2011; 2012; Macer, T., Wilson, S.
- Market research with the iPad panel from Axel Springer Media Impact [in German: Marktforschung mit dem iPad-Panel von Axel Springer Media Impact]; 2012
- The Impact of Visual Design in Survey Cover Letters on Response and Web Take-Up Rates; 2012; Mockovak, W.
- Inventory of published research: Response burden measurement and reduction in official business statistics...; 2011; Giesen, D. & Snijkers, G. (Eds.), Bavdaz, M., Bergstrom, Y., Gravem, D. F., Haraldsen, G., Hedlin, D...
- Less questions, more data: Revitalizing the european currency in single source affluent audience measurement...; 2011; Hartman, H.
- Linking website exposure data to survey data: A single-source solution; 2011; Krahn, J., Landi, J., Melton, E.
- Inference in surveys with sequential mixed-mode data collection; 2011; Buelens, B., van der Brakel, J.
- Search and email still top the list of most popular online activities; 2011; Purcell, K.
- On the experience and evidence about mixing modes of data collection in large-scale surveys where the...; 2011; Dex, S., Gumy, J.
- What is Probit; 2011
- User agent; 2011
- Unpublished internal Google report on break off rates by device type; 2011; Callegaro, M.
- The impact of cookie deletion on site-server and ad-server metrics in Australia. An empirical comScore...; 2011
- State of mobile measurement; 2011; Gluck, M.
- SDSC Announces scalable, high-performance data storage cloud; 2011
- New Esomar survey on use of cookies and tracking technologies; 2011
- Mobile, webmail, desktops: Where are we viewing email now?; 2011
- Just published: Forrester Wave™ of enterprise feedback management satisfaction and loyalty solutions...; 2011; McInnes, A.
- ISER working paper 2011-31. Is it a good idea to optimise question format for mode of data collection...; 2011; Nicolaas, G., Campanelli, P., Hope, S., Jaeckle, A., Lynn, P.
- Internet access quarterly update 2011 Q1; 2011
- Households with Computers, Telephone Subscriptions, and Internet Access, Selected Years, 1997 - 2010; 2011
- GRE® program announces big benefits and big savings for GRE® test takers worldwide; 2011
- Google and Kantar develop measurement panel; 2011
- Global market research 2011; 2011
- Eurobarometer Special surveys: EB75.1 E-Communications Household Survey. Special Eurobarometer 362; 2011
- Causes of survey incompletes: Why panelists say they abandon surveys; 2011; Henning, J.
- Beyond data stability: Rising above quality concerns; 2011
- Background - QSOAP; 2011
- Standard Definitions: Final Dispositions of Case Codes and Outcome Rates for Surveys 2011; 2011
- Web Survey Methodology: Interface Design, Sampling and Statistical Inference; 2011; Couper, M. P.
- Effect of interview modes on measurement of identity; 2011; Nandi, A., Platt, L.
- Maintaining Cross-Sectional Representativeness in a Longitudinal General Population Survey ; 2011; Lynn, P.
- Understanding Society Innovation Panel Wave 3: Results from Methodological Experiments; 2011; Burton, J., Budd, S., Gilbert, E., Jaeckle, A., McFall, S., Uhrig, S. C. N.
- The Effect of a Mixed Mode Wave on Subsequent Attrition in a Panel Survey: Evidence from the Understanding...; 2011; Lynn, P.
- Is it a good idea to optimise question format for mode of data collection? Results from a mixed modes...; 2011; Nicolaas, G., Campanelli, P., Hope, S., Lynn, P., Nandi, A.
- In the Face of Declining Budgets: The Student Experience at Washington State University ; 2011; Allen, T., Dillman, D. A., Garza, B., Millar, M. M.
- Framing Effects and Expected Social Security Claiming Behavior; 2011; Brown, Je., Kapteyn, A., Mitchell, O. S.
- Computer Assisted Interview Testing Tool (CTT) - a review of new features and how the tool has improved...; 2010; Stark, R., Gatward, R.
- Address-based Sampling Nets Success for KnowledgePanel® Recruitment and Sample Representation; 2010; DiSogra, C.
- Mixed-Method Approaches to Social Network Analysis; 2010; Edwards, G.
- Measuring Intent to Participate and Participation in the 2010 Census and Their Correlates and Trends...; 2010; Pasek, J., Krosnick, J. A.
- What it takes to be a top 100 website; 2010
- The psychology of survey response. An ASA webinar; 2010; Tourangeau, R.
- Site-intercept survey best practices; 2010; Henning, J.
- Real ID. State of The Art Representative and Repeatable Online Samples. Behaviorally Profiled Respondents...; 2010; Gittelman, S. H., Trimarchi, E.
- Overview of data collection methodology; 2010